
    Default Estimation, Correlated Defaults, and Expert Information

    Capital allocation decisions are made on the basis of an assessment of creditworthiness. Default is a rare event for most segments of a bank's portfolio, so the available data can be minimal. Inference about default rates is essential for efficient capital allocation, for risk management, and for compliance with the Basel II rules on capital standards for banks. Expert information is crucial in inference about defaults. A Bayesian approach is proposed and illustrated using prior distributions assessed from industry experts. A maximum entropy approach is used to represent expert information. The binomial model, most common in applications, is extended to allow correlated defaults while remaining consistent with Basel II. The application shows that probabilistic information can be elicited from experts and that econometric methods can be useful even when data are sparse.
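
    A rough numerical sketch of the kind of calculation described above: a Beta prior for a segment's default probability is fitted to two expert-elicited quantiles (a simpler device than the maximum entropy representation used in the paper) and updated with observed defaults under the plain binomial model. The quantile values, portfolio size, and default count are invented, and the correlated-default extension is not implemented.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical expert opinion: the median default probability is about 0.3%
# and the 95th percentile is about 1% (illustrative numbers only).
target_q = {0.5: 0.003, 0.95: 0.01}

def quantile_gap(params):
    """Squared distance between Beta quantiles and the expert's quantiles."""
    a, b = np.exp(params)                  # keep shape parameters positive
    return sum((stats.beta.ppf(p, a, b) - q) ** 2 for p, q in target_q.items())

a, b = np.exp(optimize.minimize(quantile_gap, x0=np.log([1.0, 200.0])).x)

# Observed data for the segment: n obligors, k defaults (sparse by design).
n, k = 400, 1

# The Beta prior is conjugate to the binomial likelihood, so the posterior is Beta.
post = stats.beta(a + k, b + n - k)
print(f"prior mean PD   {a / (a + b):.4%}")
print(f"posterior mean  {post.mean():.4%}, 95% credible bound {post.ppf(0.95):.4%}")
```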

    Heteroskedasticity-Autocorrelation Robust Standard Errors Using the Bartlett Kernel without Truncation

    In this paper we analyze heteroskedasticity-autocorrelation (HAC) robust tests constructed using the Bartlett kernel without truncation. We show that, although such an HAC estimator is not consistent, asymptotically valid testing is still possible, and that tests using the Bartlett kernel without truncation are exactly equivalent to the HAC robust tests recently proposed by Kiefer, Vogelsang and Bunzel (2000, Econometrica, 68, pp. 695-714).
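
    A minimal sketch of the estimator being analyzed, assuming the object of interest is the mean of a scalar time series: the Bartlett kernel is applied with bandwidth equal to the sample size, so no autocovariance is truncated. The AR(1) data are invented, and the non-standard critical values tabulated by Kiefer, Vogelsang and Bunzel are not reproduced.

```python
import numpy as np

def bartlett_no_truncation_variance(v):
    """HAC variance of the sample mean using Bartlett weights with bandwidth = T.

    Every sample autocovariance enters with weight (1 - j/T); nothing is
    truncated, so the estimator is not consistent, but the resulting
    t-statistic still has a well-defined (non-normal) limiting distribution.
    """
    v = np.asarray(v, dtype=float)
    T = v.size
    u = v - v.mean()
    gamma = np.array([u[j:] @ u[:T - j] / T for j in range(T)])  # autocovariances
    weights = 1.0 - np.arange(T) / T                             # Bartlett, bandwidth = T
    return gamma[0] + 2.0 * np.sum(weights[1:] * gamma[1:])

# Illustrative data: an AR(1) series with true mean zero.
rng = np.random.default_rng(0)
y = np.empty(500)
y[0] = rng.normal()
for t in range(1, y.size):
    y[t] = 0.5 * y[t - 1] + rng.normal()

omega = bartlett_no_truncation_variance(y)
t_stat = np.sqrt(y.size) * y.mean() / np.sqrt(omega)
print(f"HAC t-statistic: {t_stat:.3f}  (compare to KVB/fixed-b critical values, not N(0,1))")
```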

    Default Estimation for Low-Default Portfolios

    The problem in default probability estimation for low-default portfolios is that there is little relevant historical default data. No amount of data processing can fix this problem; more information is required. Incorporating expert opinion formally is an attractive option.

    The Maximum Entropy Distribution for Stochastically Ordered Random Variables with Fixed Marginals

    Stochastically ordered random variables with given marginal distributions are combined into a joint distribution that preserves both the ordering and the marginals, using a maximum entropy formulation. A closed-form expression is obtained. One application is default estimation for different portfolio segments, where priors on the individual default probabilities are available and the stochastic ordering is agreed upon by separate experts. The ME formulation allows an efficiency improvement over separate analyses.
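
    The paper's closed-form maximum entropy solution is not reproduced here. As a small illustration of the ingredients, the sketch below takes two hypothetical Beta marginals for the default probabilities of a safer and a riskier segment, verifies the first-order stochastic dominance that underlies the ordering, and samples from the comonotone coupling, which preserves the marginals and the almost-sure ordering but is not the maximum entropy joint distribution derived in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical marginal priors for two segments' default probabilities,
# with segment A believed to be safer than segment B (illustrative shapes).
pd_a = stats.beta(2, 600)   # safer segment
pd_b = stats.beta(2, 300)   # riskier segment

# Check first-order stochastic dominance: F_A(x) >= F_B(x) for all x
# means PD_A is stochastically smaller than PD_B.
grid = np.linspace(1e-6, 0.05, 500)
assert np.all(pd_a.cdf(grid) >= pd_b.cdf(grid) - 1e-12), "ordering not satisfied"

# Comonotone coupling: feed the same uniform draw through both quantile
# functions.  Marginals are preserved and PD_A <= PD_B holds draw by draw.
rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)
draws_a, draws_b = pd_a.ppf(u), pd_b.ppf(u)

print("ordering holds in every draw:", bool(np.all(draws_a <= draws_b)))
print(f"E[PD_A] = {draws_a.mean():.4%}   E[PD_B] = {draws_b.mean():.4%}")
```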

    Can the Arrow of Time be understood from Quantum Cosmology?

    I address the question whether the origin of the observed arrow of time can be derived from quantum cosmology. After a general discussion of entropy in cosmology and some numerical estimates, I give a brief introduction to quantum geometrodynamics and argue that this may provide a sufficient framework for studying this question. I then show that a natural boundary condition of low initial entropy can be imposed on the universal wave function. The arrow of time is then correlated with the size of the Universe and emerges from an increasing amount of decoherence due to entanglement with unobserved degrees of freedom. Remarks are also made concerning the arrow of time in multiverse pictures and in scenarios motivated by dark energy. Comment: 14 pages, to appear in "The Arrow of Time", ed. by L. Mersini-Houghton and R. Vaas

    The Smooth Colonel Meets the Reverend

    Kernel smoothing techniques have attracted much attention and some notoriety in recent years. The attention is well deserved as kernel methods free researchers from having to impose rigid parametric structure on their data. The notoriety arises from the fact that the amount of smoothing (i.e., local averaging) that is appropriate for the problem at hand is under the control of the researcher. In this paper we provide a deeper understanding of kernel smoothing methods for discrete data by leveraging the unexplored links between hierarchical Bayes models and kernel methods for discrete processes. A number of potentially useful results are thereby obtained, including bounds on when kernel smoothing can be expected to dominate non-smooth (e.g., parametric) approaches in mean squared error, as well as suggestions for thinking about the appropriate amount of smoothing.
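
    A small numerical illustration of the link described above, using the Aitchison-Aitken kernel for an unordered discrete variable: the kernel estimate shrinks cell frequencies toward the uniform distribution, and the same estimate is the posterior mean under a symmetric Dirichlet prior once the prior strength is matched to the smoothing parameter. The category count, sample, and smoothing parameter below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
c = 4                                      # number of unordered categories
true_p = np.array([0.55, 0.25, 0.15, 0.05])
x = rng.choice(c, size=30, p=true_p)       # a small sample, as in sparse-data settings

freq = np.bincount(x, minlength=c) / x.size      # unsmoothed (frequency) estimator

# Aitchison-Aitken kernel estimate with smoothing parameter lam in [0, (c-1)/c]:
# each observation puts mass (1 - lam) on its own cell and lam/(c-1) elsewhere.
lam = 0.10
kernel = (1 - lam) * freq + (lam / (c - 1)) * (1 - freq)

# Hierarchical-Bayes counterpart: posterior mean under a Dirichlet(alpha,...,alpha)
# prior.  Matching lam/(c-1) = alpha/(n + c*alpha) gives the same shrinkage.
n = x.size
alpha = lam / (c - 1) * n / (1 - lam * c / (c - 1))
bayes = (np.bincount(x, minlength=c) + alpha) / (n + c * alpha)

print("kernel estimate :", np.round(kernel, 4))
print("Dirichlet mean  :", np.round(bayes, 4))   # identical up to rounding
```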

    Evidence of non-Markovian behavior in the process of bank rating migrations

    This paper estimates transition matrices for the ratings on financial institutions, using an unusually informative data set. We show that the process of rating migration exhibits significant non-Markovian behavior, in the sense that the transition intensities are affected by macroeconomic and bank-specific variables. We illustrate how the use of a continuous-time framework may improve the estimation of the transition probabilities. However, the time-homogeneity assumption, frequently made in economic applications, does not hold, even for short time intervals. Thus, the information provided by migrations alone is not enough to forecast the future behavior of ratings: the stage of the business cycle should be taken into account, and individual characteristics of banks must be considered as well. Keywords: financial institutions; macroeconomic variables; capitalization; supervision; transition intensities. JEL classification: C4; E44; G21; G23; G38.
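
    A stylized sketch of the continuous-time idea, assuming time-homogeneous intensities and ignoring the covariates the paper studies: transition intensities are estimated as observed moves divided by time at risk in each rating, and transition probabilities for any horizon follow from the matrix exponential of the generator. The three-state rating scale and the spell data are invented.

```python
import numpy as np
from scipy.linalg import expm

states = ["A", "B", "D"]                     # toy scale: two ratings plus default
k = len(states)

# Hypothetical spells: (time spent in state, origin, destination or None if censored)
spells = [
    (2.0, 0, 1), (1.5, 0, None), (3.0, 0, 1), (0.5, 1, 2),
    (2.5, 1, 0), (1.0, 1, None), (4.0, 0, None), (1.2, 1, 2),
]

time_in_state = np.zeros(k)
moves = np.zeros((k, k))
for duration, i, j in spells:
    time_in_state[i] += duration
    if j is not None:
        moves[i, j] += 1

# Maximum-likelihood generator under time homogeneity:
# off-diagonal intensity = transitions observed / time at risk in the origin state.
Q = np.divide(moves, time_in_state[:, None],
              out=np.zeros_like(moves), where=time_in_state[:, None] > 0)
np.fill_diagonal(Q, -Q.sum(axis=1))          # rows of a generator sum to zero
Q[k - 1, :] = 0.0                            # default is absorbing: no exits

# One-year and five-year transition matrices follow from the matrix exponential.
print(np.round(expm(Q * 1.0), 3))
print(np.round(expm(Q * 5.0), 3))
```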

    A Simulation Estimator for Testing the Time Homogeneity of Credit Rating Transition

    The measurement of credit quality is at the heart of the models designed to assess the reserves and capital needed to support the risks of both individual credits and portfolios of credit instruments. A popular specification for credit-rating transitions is the simple, time-homogeneous Markov model. While the Markov specification cannot really describe processes in the long run, it may be useful for adequately describing short-run changes in portfolio risk. In this specification, the entire stochastic process can be characterized in terms of estimated transition probabilities. However, the simple homogeneous Markovian transition framework is restrictive. We propose a test of the null hypothesis of time homogeneity that can be performed on the sorts of data often reported. We apply the test to four data sets: commercial paper, sovereign debt, municipal bonds, and S&P corporate ratings. The results indicate that commercial paper looks Markovian on a 30-day time scale for up to 6 months; sovereign debt also looks Markovian (perhaps due to a small sample size); municipals are well modeled by the Markov specification for up to 5 years, but could probably benefit from frequent updating of the estimated transition matrix or from more sophisticated modeling; and S&P corporate ratings are approximately Markov over 3 transitions but not 4.
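
    This is not the authors' simulation estimator, but a simplified sketch of the same idea under stated assumptions: the one-period transition matrix is estimated from a rating panel, the discrepancy between the two-period matrix and the square of the one-period matrix is computed, and a parametric bootstrap from the fitted homogeneous Markov chain gives a simulated p-value. The "observed" panel is itself simulated here, so the test should not reject.

```python
import numpy as np

rng = np.random.default_rng(3)

def transition_matrix(panel, lag):
    """Row-normalised counts of moves from the rating at t to the rating at t + lag."""
    k = panel.max() + 1
    counts = np.zeros((k, k))
    for path in panel:
        for a, b in zip(path[:-lag], path[lag:]):
            counts[a, b] += 1
    totals = counts.sum(axis=1, keepdims=True)
    return counts / np.where(totals == 0, 1, totals)   # guard against empty rows

def distance(panel):
    """Discrepancy between the two-period matrix and the squared one-period matrix."""
    p1, p2 = transition_matrix(panel, 1), transition_matrix(panel, 2)
    return np.abs(p2 - p1 @ p1).max()

def simulate_panel(p, n_firms, n_periods, start):
    k = p.shape[0]
    panel = np.empty((n_firms, n_periods), dtype=int)
    panel[:, 0] = rng.choice(k, size=n_firms, p=start)
    for t in range(1, n_periods):
        for i in range(n_firms):
            panel[i, t] = rng.choice(k, p=p[panel[i, t - 1]])
    return panel

# Hypothetical "observed" panel of ratings (rows = obligors, columns = periods).
true_p = np.array([[0.90, 0.08, 0.02],
                   [0.10, 0.80, 0.10],
                   [0.00, 0.00, 1.00]])
observed = simulate_panel(true_p, n_firms=200, n_periods=12, start=[0.6, 0.4, 0.0])

# Parametric bootstrap: simulate panels from the fitted one-period matrix and
# see how unusual the observed discrepancy is under homogeneous Markov dynamics.
p_hat = transition_matrix(observed, 1)
start_hat = np.bincount(observed[:, 0], minlength=3) / observed.shape[0]
null_draws = [distance(simulate_panel(p_hat, 200, 12, start_hat)) for _ in range(100)]
p_value = np.mean(np.array(null_draws) >= distance(observed))
print(f"simulated p-value for the time-homogeneous Markov null: {p_value:.2f}")
```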

    Robust Model Selection in Dynamic Models with an Application to Comparing Predictive Accuracy

    A model selection procedure based on a general criterion function, with an example of the Kullback-Leibler Information Criterion (KLIC) using quasi-likelihood functions, is considered for dynamic non-nested models. We propose a robust test which generalizes Lien and Vuong's (1987) test with a heteroskedasticity/autocorrelation consistent (HAC) variance estimator. We use the fixed-b asymptotics developed in Kiefer and Vogelsang (2005) to improve the asymptotic approximation to the sampling distribution of the test statistic. The fixed-b approach is compared with a bootstrap method and the standard normal approximation in Monte Carlo simulations. The fixed-b asymptotics and the bootstrap method are found to be markedly superior to the standard normal approximation. An empirical application to foreign exchange rate forecasting models is presented.
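
    A stylized version of the statistic under study, assuming Gaussian quasi-likelihoods for two competing one-step forecasts: the per-period difference in quasi-log-likelihoods is studentized with a Bartlett HAC variance. The data, the forecasts, and the bandwidth are invented, and the fixed-b and bootstrap critical values discussed in the paper are not reproduced.

```python
import numpy as np

def bartlett_hac_variance(d, bandwidth):
    """HAC long-run variance of the mean of d using Bartlett (Newey-West) weights."""
    d = np.asarray(d, dtype=float)
    T = d.size
    u = d - d.mean()
    omega = u @ u / T
    for j in range(1, bandwidth + 1):
        w = 1.0 - j / (bandwidth + 1)
        omega += 2.0 * w * (u[j:] @ u[:T - j]) / T
    return omega

# Illustrative target series and two competing one-step forecasts.
rng = np.random.default_rng(4)
T = 300
y = np.cumsum(rng.normal(size=T)) * 0.01           # a random-walk-like series
forecast_1 = np.concatenate([[0.0], y[:-1]])       # "no change" forecast
forecast_2 = 0.8 * forecast_1                      # a shrunken alternative

# Quasi-log-likelihood per period: Gaussian density of the forecast errors.
def gaussian_loglik(err):
    s2 = err.var()
    return -0.5 * (np.log(2 * np.pi * s2) + err ** 2 / s2)

d = gaussian_loglik(y - forecast_1) - gaussian_loglik(y - forecast_2)

# Vuong-type statistic: positive values favour model 1.  Its null distribution
# depends on how the HAC variance is treated (standard normal, fixed-b, bootstrap).
omega = bartlett_hac_variance(d, bandwidth=int(T ** (1 / 3)))
t_stat = np.sqrt(T) * d.mean() / np.sqrt(omega)
print(f"KLIC-difference t-statistic: {t_stat:.3f}")
```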
